Fingerprint positioning method based on measurement report signal clustering
Haiyong ZHANG, Xianjin FANG, Enwan ZHANG, Baoyu LI, Chao PENG, Jianxiang MU
Journal of Computer Applications    2023, 43 (12): 3947-3954.   DOI: 10.11772/j.issn.1001-9081.2023010005

Aiming at the low positioning precision and efficiency of fingerprint positioning methods based on Weighted K-Nearest Neighbor (WKNN) and machine learning algorithms, a fingerprint positioning method based on Measurement Report (MR) signal clustering was proposed. Firstly, MR signals were divided into three attributes: indoor, road and outdoor. Then, by using Geographic Information System (GIS) information, the grids were divided into building, road and outdoor sub-regions, and MR data of each attribute were assigned to the sub-regions with the corresponding attribute. Finally, the K-Means clustering algorithm was used to cluster and analyze the MR signals in each grid and create virtual sub-regions under each sub-region, and the WKNN algorithm was used to match the MR test samples. Besides, the average positioning accuracy was calculated by using the Euclidean distance, and the positioning performance of the proposed method was tested on MR data from the production environment. Experimental results show that the proportion of samples with positioning error within 50 m reaches 71.21% for the proposed method, which is 2.64 percentage points higher than that of the WKNN algorithm, and the average positioning error of the proposed method is 44.73 m, which is 7.60 m lower than that of the WKNN algorithm. It can be seen that the proposed method has good positioning precision and efficiency, and can meet the positioning requirements of MR data in the production environment.
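A minimal sketch of the cluster-then-match idea described above, assuming the MR fingerprints are plain signal-strength vectors; the GIS sub-region division is omitted and all data and parameters here are hypothetical.

```python
# Cluster reference fingerprints with K-Means, then locate a test sample with WKNN
# inside its cluster ("virtual sub-region"). Synthetic data stands in for real MR records.
import numpy as np
from sklearn.cluster import KMeans

def wknn_locate(test_fp, ref_fps, ref_pos, k=4, eps=1e-6):
    """Weighted K-Nearest Neighbor position estimate within one cluster."""
    d = np.linalg.norm(ref_fps - test_fp, axis=1)          # signal-space distances
    idx = np.argsort(d)[:k]                                 # k nearest fingerprints
    w = 1.0 / (d[idx] + eps)                                # inverse-distance weights
    return (w[:, None] * ref_pos[idx]).sum(axis=0) / w.sum()

rng = np.random.default_rng(0)
ref_fps = rng.normal(size=(500, 6))                         # 500 samples, 6 cells' signal levels
ref_pos = rng.uniform(0, 200, size=(500, 2))                # grid coordinates in metres

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(ref_fps)

test_fp, true_pos = ref_fps[0] + rng.normal(scale=0.1, size=6), ref_pos[0]
cluster = km.predict(test_fp[None])[0]                      # route the sample to its virtual sub-region
mask = km.labels_ == cluster
est = wknn_locate(test_fp, ref_fps[mask], ref_pos[mask])
print("positioning error (m):", np.linalg.norm(est - true_pos))
```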

Cross-chain privacy protection scheme of consortium blockchain based on improved notary mechanism
Xiaohan GUO, Zhongyuan YAO, Yong ZHANG, Shangkun GUO, Chao WANG, Xueming SI
Journal of Computer Applications    2023, 43 (10): 3028-3037.   DOI: 10.11772/j.issn.1001-9081.2022111641

Cross-chain interaction of consortium blockchains not only enhances the functions of consortium blockchain applications but also expands their scope of use, so it is of great significance to the application promotion and industrial development of consortium blockchains. However, cross-chain interaction of consortium blockchains still suffers from privacy disclosure of user identities and asset transaction information, which has become a major factor hindering the wide application of this technology. In view of the above problems, a cross-chain privacy protection scheme for consortium blockchain assets based on an improved notary mechanism was proposed. In the scheme, a hash-locking mechanism was first introduced at the contract layer to improve the traditional single-signature notary cross-chain method, thereby reducing the centralization risk and the possibility of malicious behavior in the traditional notary mechanism. Then, the characteristics of homomorphic encryption were used to make transaction assets usable but invisible while ensuring the legitimacy of transactions. At the same time, an identity-based cryptographic algorithm in multi-Key Generation Center (KGC) mode was used to protect user identity privacy at the network layer. The theoretical analysis and experimental results show that the proposed scheme effectively protects user identity information and asset information in the cross-chain interaction of consortium blockchains, and has lower signature and verification overhead than other similar schemes.
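As a rough illustration of the hash-locking idea mentioned at the contract layer, the sketch below uses a plain SHA-256 commitment; the notary group, homomorphic encryption and multi-KGC identity layers of the scheme are not modelled, and all names are hypothetical.

```python
# Hash lock: funds can be redeemed only by revealing the preimage of a published hash
# before a timeout, which is the basic primitive behind hash-locked cross-chain swaps.
import hashlib, os, time

def make_lock(secret: bytes) -> str:
    return hashlib.sha256(secret).hexdigest()

def redeem(lock: str, preimage: bytes, deadline: float) -> bool:
    """Redemption succeeds only with the correct preimage and before the deadline."""
    return time.time() < deadline and hashlib.sha256(preimage).hexdigest() == lock

secret = os.urandom(32)                    # chosen by the cross-chain initiator
lock = make_lock(secret)                   # published in the contracts on both chains
print(redeem(lock, secret, time.time() + 3600))    # True: correct preimage in time
print(redeem(lock, b"wrong", time.time() + 3600))  # False: preimage mismatch
```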

Public transportation epidemic monitoring system based on edge computing
Huiwen XIA, Zhongyu ZHAO, Zhuoer WANG, Qingyong ZHANG, Feng PENG
Journal of Computer Applications    2022, 42 (7): 2132-2138.   DOI: 10.11772/j.issn.1001-9081.2021050727

In view of the existing monitoring systems' inability to cope with cross-infection and the difficulty of tracing contacts in an epidemic environment, a design scheme for a public transportation monitoring system based on edge computing was proposed. Firstly, a graph database was established to store passenger and ride information, and a dual-database model was used to prevent the blocking caused by index building, thereby balancing insertion efficiency and search efficiency. Then, for the extraction of vehicle and passenger image information, the HSV (Hue Saturation Value) color space was used to preprocess the images, and a three-dimensional face model was established to improve the recognition accuracy of the neural network. When the subject wore a mask, the facial feature points could still be regressed from the prominent nose-tip and jaw feature points and the unoccluded nose-bridge feature points. Finally, k-hop search was used to find close contacts quickly. In the feature comparison test, the accuracy of the model is 99.44% on the BioID dataset and 99.23% on the PubFig dataset, and its false negative rates on the two datasets are both less than 0.01%. In the graph search efficiency test, there is little difference between the graph database and the relational database for shallow searches, but the graph database is more efficient as the search depth increases. After verifying the theoretical feasibility, the actual environment of buses and bus stops was simulated. In this test, the proposed system achieves a recognition accuracy of 99.98% and an average recognition time of about 21 ms, which meets the requirements of epidemic monitoring. The proposed system design can meet the special needs of public safety during an epidemic, realizing person recognition, route recording and potential-contact search, and can effectively ensure public transportation safety.
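The close-contact lookup can be illustrated with a plain breadth-first k-hop search; the sketch below assumes the ride records are already loaded as an in-memory adjacency list rather than a graph database, and the passenger identifiers are hypothetical.

```python
# Breadth-first search limited to k hops: every node reached within k edges of the
# source is treated as a potential close contact.
from collections import deque

def k_hop_contacts(graph, source, k):
    """Return all nodes reachable from `source` within k hops."""
    seen, frontier = {source}, deque([(source, 0)])
    contacts = set()
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue
        for nb in graph.get(node, ()):
            if nb not in seen:
                seen.add(nb)
                contacts.add(nb)
                frontier.append((nb, depth + 1))
    return contacts

# Hypothetical passenger graph: person -> people who shared a trip with them.
graph = {"p1": ["p2", "p3"], "p2": ["p4"], "p3": [], "p4": ["p5"]}
print(k_hop_contacts(graph, "p1", 2))   # {'p2', 'p3', 'p4'}
```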

Harris hawks optimization algorithm based on chemotaxis correction
Cheng ZHU, Xuhua PAN, Yong ZHANG
Journal of Computer Applications    2022, 42 (4): 1186-1193.   DOI: 10.11772/j.issn.1001-9081.2021071244

Focused on the disadvantages of the Harris Hawks Optimization (HHO) algorithm, namely slow convergence and easily falling into local optima, an improved HHO algorithm called Chemotaxis Correction HHO (CC-HHO) was proposed. Firstly, the state of the convergence curve was identified by calculating the decline rate and change weight of the optimal solution. Secondly, the Chemotaxis Correction (CC) mechanism from the Bacterial Foraging Optimization (BFO) algorithm was introduced into the local search stage to improve the optimization accuracy. Thirdly, the law of energy consumption was integrated into the update of the escape energy factor and the jump distance to balance exploration and exploitation. Fourthly, elite selection over different combinations of the optimal and sub-optimal solutions was used to improve the universality of the global search. Finally, when the search fell into a local optimum, the escape energy was perturbed to force the algorithm to jump out. The performance of the improved algorithm was tested on ten benchmark functions. The results show that the search accuracy of the CC-HHO algorithm on unimodal functions is better than those of the Gravitational Search Algorithm (GSA), Particle Swarm Optimization (PSO) algorithm, Whale Optimization Algorithm (WOA) and four other improved HHO algorithms by more than ten orders of magnitude, and by more than one order of magnitude on multimodal functions; with search stability improved by more than 10% on average, the proposed algorithm also converges significantly faster than the above comparison algorithms, with a more obvious convergence trend. Experimental results show that CC-HHO effectively improves the efficiency and robustness of the original algorithm.
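For illustration only, the sketch below shows the standard HHO escape-energy schedule (E = 2·E0·(1 − t/T), E0 ∈ [−1, 1]) with a simple random disturbance applied when the search stagnates; the chemotaxis-correction local search and elite selection of CC-HHO are not reproduced, and the jitter parameter is an assumption.

```python
# Escape-energy factor of HHO with an extra perturbation when the best solution has
# stopped improving, mimicking the "forced jump out" described above.
import random

def escape_energy(t, T, stagnated=False, jitter=0.5):
    E0 = random.uniform(-1.0, 1.0)            # initial escape energy of the prey
    E = 2.0 * E0 * (1.0 - t / T)              # linear decay over the iterations
    if stagnated:                             # best solution unchanged for a while:
        E += random.uniform(-jitter, jitter)  # disturb E to escape the local optimum
    return E

for t in (0, 250, 499):
    print(t, round(escape_energy(t, 500), 3), round(escape_energy(t, 500, stagnated=True), 3))
```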

Neighborhood decision tree construction algorithm based on variable-precision neighborhood equivalent granules
Xin XIE, Xianyong ZHANG, Xuanye WANG, Pengfei TANG
Journal of Computer Applications    2022, 42 (2): 382-388.   DOI: 10.11772/j.issn.1001-9081.2021071168

Aiming at the shortcomings of existing decision tree algorithms on continuous data classification, such as information loss and poor classification performance, a Neighborhood Decision Tree (NDT) construction algorithm was proposed. Firstly, the variable-precision neighborhood equivalent granules of the neighborhood decision information system were mined, and their related properties were discussed. Secondly, a neighborhood Gini index measure was constructed based on the variable-precision neighborhood equivalent granules to measure the uncertainty of the neighborhood decision information system. Thirdly, the neighborhood Gini index measure was used to induce the node selection conditions of the tree, and the variable-precision neighborhood equivalent granules were used as the splitting rules to construct the NDT. Experimental results on UCI datasets show that the accuracy of the NDT algorithm is generally improved by about 20 percentage points compared with those of the Iterative Dichotomiser 3 (ID3) algorithm, Classification And Regression Tree (CART) algorithm, C4.5 algorithm and the algorithm combining Information Gain and Gini Index (IGGI), indicating that the proposed NDT algorithm is effective.
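A minimal sketch of a neighborhood Gini measure in the spirit of the description above, assuming the neighborhood of a sample is the set of samples within a radius delta; the variable-precision equivalent granules and the full tree-construction procedure are not reproduced, and the data are synthetic.

```python
# Average Gini impurity of the class distribution inside each sample's delta-neighborhood:
# lower values indicate an attribute whose neighborhoods are purer, i.e. a better split candidate.
import numpy as np

def neighborhood_gini(X, y, delta=0.15):
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    ginis = []
    for i in range(len(X)):
        nb = np.linalg.norm(X - X[i], axis=1) <= delta      # delta-neighborhood of sample i
        _, counts = np.unique(y[nb], return_counts=True)
        p = counts / counts.sum()
        ginis.append(1.0 - np.sum(p ** 2))                  # Gini impurity inside the granule
    return float(np.mean(ginis))

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 3))                              # normalized continuous attributes
y = (X[:, 0] > 0.5).astype(int)                             # class depends only on attribute 0
print(neighborhood_gini(X[:, :1], y), neighborhood_gini(X[:, 1:2], y))  # attribute 0 scores purer
```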

Network interconnection model based on trusted computing
LIU Yibo YIN Xiaochuan GAO Peiyong ZHANG Yibo
Journal of Computer Applications    2014, 34 (7): 1936-1940.   DOI: 10.11772/j.issn.1001-9081.2014.07.1936

The problem of intranet security arose almost together with network interconnection, and it has become more pressing as the demand for interconnection grows worldwide. Traditional technology cannot achieve both security and connectivity well. In view of this, a method based on trusted computing technology was put forward. The basic idea is to build a trusted model of the network interconnection system, whose core is the verification of the identity and behavior of accessing entities. First, the IBA algorithm was reformed to design a cryptographic protocol between the authentication system and the accessors, and its effectiveness was analyzed in terms of both function and accuracy. Second, an evaluation tree model was established through analysis of the entities' sustained behavior, so that the security situation of access terminals could be evaluated. Finally, the evaluation method was verified through an experiment.

LI Zuoyong ZHANG Xiaoli WANG Jiayang ZHANG Zhengjian
Journal of Computer Applications    2014, 34 (6): 1641-1644.   DOI: 10.11772/j.issn.1001-9081.2014.06.1641

Aiming at the limitations of the simple Monkey-King Genetic Algorithm (MKGA), which easily falls into local minima and has poor stability, an Immune Evolutionary Hybridized MKGA (MKGAIEH) was proposed. MKGAIEH divides the total population into several sub-groups. In order to make full use of the information of the best individual (monkey-king) of the total population, the Immune Evolutionary Algorithm (IEA) was introduced for its iterative calculation. In addition, for the other individuals in the sub-groups, crossover and mutation operations were performed with the monkey-kings of the sub-groups and of the total population. After the local searches of all sub-groups were completed, the solutions of the sub-groups were mixed again. As the iteration proceeds, this strategy, which combines global information exchange with local search, not only avoids premature convergence but also approximates the global optimal solution with higher accuracy. Comparison experiments on 6 test functions were conducted using MKGAIEH, MKGA, Improved MKGA (IMKGA), Bee Evolutionary Genetic Algorithm (BEGA), the Shuffled Frog Leaping Algorithm based on Immune Evolutionary Particle Swarm Optimization (IEPSOSFLA), and the Common climbing Operator Genetic Algorithm (COGA). The results show that MKGAIEH can find the global optimal solutions for all 6 test functions, and its mean values and standard deviations on 5 of the test functions reach the best accuracy, improved by several orders of magnitude over those of the comparison algorithms. Therefore, MKGAIEH has the best searching ability and stability.

Fast collision detection algorithm based on image space
YU Haijun MA Chunyong ZHANG Tao CHEN Ge
Journal of Computer Applications    2013, 33 (02): 530-533.   DOI: 10.3724/SP.J.1087.2013.00530
In order to meet the high requirements of real-time collision detection in increasingly complex virtual environments, a fast collision detection algorithm based on image space was proposed, which made efficient use of the Graphics Processing Unit (GPU). Based on a hierarchical binary tree and collision detection between Oriented Bounding Boxes (OBB), the algorithm could quickly eliminate disjoint objects in the virtual scene. Using the potentially colliding set, the efficiency of the algorithm is significantly improved compared with the RECODE algorithm. The experimental results show that the algorithm achieves good results and higher efficiency, especially in highly complex virtual environments.
Password multimodality method in financial transactions
DAI Yong ZHANG Weijing SUN Guangwu
Journal of Computer Applications    2013, 33 (01): 135-137.   DOI: 10.3724/SP.J.1087.2013.00135
In financial transactions, the client authentication system with a single keyboard-password mode has safety and reliability problems. To solve these problems, a password multimodality method was proposed. Modal sensors acquired the password codeword information and transmitted its normalized result. The formatted code information was pre-processed according to its properties and then classified by attribute. After that, the multimodal passwords were fused by sharing public units. For an M-bit password in which each bit has N possible modalities, the password theft rate is 1/(10^M × C(N×M, M)). In this multimodal input system, the keyboard password and black-box handwriting are the two default modalities, and the application results demonstrate that the proposed method realizes disordered blended input of multimodal password codewords. At M=6, the password theft rate is 1/(10^6 × C(12, 6)), and the number of cracking methods and the difficulty of cracking increase with the number of modalities. The safety, reliability and other performances of this password input system are significantly better than those of a system with only one modality.
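The quoted theft-rate expression can be checked numerically; the short sketch below evaluates 1/(10^M × C(N×M, M)) for the default case of M = 6 digits and N = 2 modalities.

```python
# Worked check of the theft-rate formula 1/(10^M * C(N*M, M)) for an M-digit password
# with N available modalities per digit (N = 2 for keyboard + handwriting).
from math import comb

def theft_rate(M, N):
    return 1 / (10 ** M * comb(N * M, M))

print(comb(12, 6))          # 924 combinations of modality positions
print(theft_rate(6, 2))     # i.e. the rate is 1/(924 * 10^6)
```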
Real-time data stream analysis and entire process quality monitoring based on plant information
BIAN Xiao-yong ZHANG Xiao-long YU Hai
Journal of Computer Applications    2012, 32 (10): 2935-2939.   DOI: 10.3724/SP.J.1087.2012.02935
To solve the problems that production information is blocked and product quality cannot be traced in steel production, a solution for real-time data stream analysis and entire-process quality tracing based on PI (Plant Information) was proposed. The solution focused on real-time data stream partition and process monitoring, and presented statistical monitoring methods based on Statistical Quality Control (SQC) charts and process capability indices. Furthermore, a product technique and quality monitoring system was developed. The application results indicate that the implementation of real-time data stream analysis and product quality monitoring based on PI can efficiently monitor production process quality, as well as support the identification and improvement of key production technologies.
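A minimal sketch of the statistical monitoring step, assuming an individuals x̄ ± 3σ control chart and the standard Cp/Cpk capability indices; the PI real-time stream access and the quality-tracing database are not modelled, and the thickness data are synthetic.

```python
# Control limits and process capability indices for a (synthetic) quality measurement stream.
import numpy as np

def control_limits(samples):
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return mu - 3 * sigma, mu + 3 * sigma                   # lower / upper control limit

def cp_cpk(samples, lsl, usl):
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                          # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)             # capability accounting for centering
    return cp, cpk

rng = np.random.default_rng(2)
thickness = rng.normal(loc=2.00, scale=0.02, size=500)      # hypothetical strip-thickness stream
lcl, ucl = control_limits(thickness)
print("out-of-control points:", int(np.sum((thickness < lcl) | (thickness > ucl))))
print("Cp, Cpk:", cp_cpk(thickness, lsl=1.90, usl=2.10))
```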
Deep Web query interface schema matching based on matching degree and semantic similarity
FENG Yong ZHANG Yang
Journal of Computer Applications    2012, 32 (06): 1688-1691.   DOI: 10.3724/SP.J.1087.2012.01688
Query interface schema matching is a key step in Deep Web data integration. Dual Correlated Mining (DCM) can make full use of association mining to solve complex interface schema matching problems, but it suffers from inefficiency and inaccuracy in matching. Therefore, a new method based on matching degree and semantic similarity was presented to solve these problems. Firstly, the method used a correlation matrix to store the association relationships among attributes; then, matching degree was applied to calculate the degree of correlation between attributes; finally, semantic similarity was used to ensure the accuracy of the final results. The experimental results on the BAMM datasets of the University of Illinois show that the proposed method has higher precision and efficiency than DCM and improved DCM, and indicate that the method can deal with query interface schema matching problems very well.
Concept similarity computation method based on edge weighting between concepts
FENG Yong ZHANG Yang
Journal of Computer Applications    2012, 32 (01): 202-205.   DOI: 10.3724/SP.J.1087.2012.00202
The traditional distance-based similarity calculation method was described. Since pure distance calculation does not contain sufficient semantic information, an improved method was proposed that uses WordNet and edge-weighting information between concepts to measure similarity. It considers the depth and density of concepts in the corpus, i.e. the semantic richness of a concept. With this method, semantic similarity calculation issues can be solved and the calculation of similarity among concepts becomes easy. The experimental results show that the proposed method achieves a correlation of 0.9109 with the benchmark dataset of Rubenstein concept pairs. Compared with the classical method, the proposed method has higher accuracy.
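A toy sketch of depth-weighted edge counting on a hand-built is-a fragment, assuming that an edge from a concept at depth d to its parent costs 1/d, so that deeper (finer) distinctions contribute less; the paper's actual WordNet weighting and density term are not reproduced.

```python
# Similarity of two concepts as 1 / (1 + weighted path length through their deepest
# shared ancestor), on a small hypothetical concept tree.
parents = {                                    # hypothetical fragment of an is-a hierarchy
    "entity": None, "object": "entity", "animal": "object",
    "dog": "animal", "cat": "animal", "vehicle": "object", "car": "vehicle",
}

def path_to_root(c):
    path = [c]
    while parents[path[-1]] is not None:
        path.append(parents[path[-1]])
    return path                                # concept, parent, ..., root

def depth(c):
    return len(path_to_root(c)) - 1            # root has depth 0

def weighted_path(c, ancestor):
    w = 0.0
    while c != ancestor:
        w += 1.0 / depth(c)                    # edges deeper in the hierarchy cost less
        c = parents[c]
    return w

def similarity(a, b):
    common = set(path_to_root(a)) & set(path_to_root(b))
    lcs = max(common, key=depth)               # deepest shared ancestor
    return 1.0 / (1.0 + weighted_path(a, lcs) + weighted_path(b, lcs))

print(similarity("dog", "cat"), similarity("dog", "car"))   # dog/cat should score higher
```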
Shuffled frog leaping algorithm based on immune evolutionary particle swarm optimization
LI Zuo-yong ZHANG Zheng-jian YU Chun-xue
Journal of Computer Applications    2011, 31 (12): 3288-3291.  
A new shuffled frog leaping algorithm based on immune evolutionary particle swarm optimization was proposed in order to avoid the premature convergence and improve the solution precision of the basic Shuffled Frog Leaping Algorithm (SFLA). The proposed algorithm integrated the global search idea of Particle Swarm Optimization (PSO) into SFLA so as to track the optimal solutions of the sub-swarm and the whole swarm simultaneously and search thoroughly in the space near the worst solution, and it also integrated the immune evolutionary algorithm into SFLA to perform immune evolutionary iterative calculation on the optimal solution of the whole swarm, so as to make full use of the information of the optimal solution. The algorithm not only escapes from local optima and approaches the global optimal solution with higher precision, but also speeds up convergence. Calculation results show that the Immune Evolutionary Particle Swarm Optimization-Shuffled Frog Leaping Algorithm (IEPSO-SFLA) has better searching ability and stability as well as faster convergence than the basic SFLA.
Lesion area segmentation in leukoaraiosis's magnetic resonance image based on C-V model
ZHENG Xing-hua YANG Yong ZHANG Wen ZHU Ying-jun XU Wei-dong LOU Min
Journal of Computer Applications    2011, 31 (10): 2757-2759.   DOI: 10.3724/SP.J.1087.2011.02757
Concerning that the lesion areas of leukoaraiosis present hyperintense signal on the T2 FLAIR sequence of Magnetic Resonance (MR) images, a level set segmentation method based on the C-V model was proposed. First, the C-V model was improved to avoid re-initialization; second, the Otsu threshold method was used for pre-segmentation of the image, and the pre-segmentation result was directly used as the initial contour of the improved C-V model; finally, the segmentation result was obtained by curve evolution. The results show that the proposed segmentation method achieves better separation effects and realizes fast automatic segmentation. It has certain application value for the clinical diagnosis and prognosis of leukoaraiosis.
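A minimal sketch of Otsu pre-segmentation feeding the initial contour of a Chan-Vese style evolution, using scikit-image's morphological ACWE as a stand-in for the paper's improved C-V model and a synthetic hyperintense blob in place of real T2 FLAIR data.

```python
# Otsu threshold -> initial level set -> Chan-Vese style curve evolution, on a
# synthetic image with one bright "lesion" on a dark background.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.segmentation import morphological_chan_vese

rng = np.random.default_rng(3)
img = rng.normal(0.2, 0.05, size=(128, 128))              # noisy dark background
yy, xx = np.mgrid[:128, :128]
img[(yy - 64) ** 2 + (xx - 64) ** 2 < 15 ** 2] += 0.6     # bright lesion-like blob

init = img > threshold_otsu(img)                          # Otsu result as initial contour
seg = morphological_chan_vese(img, 50, init_level_set=init.astype(np.int8))
print("segmented lesion pixels:", int(seg.sum()))
```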
Limited feedback precoding for multiuser MIMO systems based on double codebook
FU Hong-liang TAO Yong ZHANG Yuan
Journal of Computer Applications    2011, 31 (09): 2325-2328.   DOI: 10.3724/SP.J.1087.2011.02325
Concerning the performance loss due to limited feedback in multiuser Multiple Input Multiple Output (MIMO) downlink systems, a new limited feedback precoding scheme for multiuser MIMO systems based on a double codebook was proposed. At the receiver, the maximum Signal-to-Interference-plus-Noise Ratio (SINR) criterion was used to select the optimal codewords from the Grassmannian codebook and the perturbation codebook, and the Grassmannian precoding codeword index and the perturbation codeword index were fed back to the transmitter; the transmitter then used the perturbation codeword to obtain the optimal capacity and compensate for the capacity loss caused by limited feedback. The simulation results show that the proposed method ensures the Bit Error Rate (BER) performance while controlling the cost of the feedback link, and the system throughput is improved effectively.
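For illustration, the sketch below reduces codeword selection to picking the unit-norm precoding vector that maximizes the received power over a random codebook; the Grassmannian/perturbation double codebook and the multiuser SINR terms of the paper are not modelled, and all sizes are assumptions.

```python
# Limited-feedback codeword selection: the receiver evaluates every codeword on its
# channel estimate and feeds back only the index of the best one.
import numpy as np

rng = np.random.default_rng(4)
Nt, Nr, B = 4, 2, 4                                   # tx/rx antennas, feedback bits
codebook = rng.normal(size=(2 ** B, Nt)) + 1j * rng.normal(size=(2 ** B, Nt))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)     # unit-norm precoding vectors

H = rng.normal(size=(Nr, Nt)) + 1j * rng.normal(size=(Nr, Nt))  # Rayleigh channel estimate

gains = np.linalg.norm(H @ codebook.T, axis=0) ** 2   # effective gain of each codeword
best = int(np.argmax(gains))                          # only this index is fed back
print("feedback index:", best, "effective gain:", round(float(gains[best]), 3))
```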
Evaluation on information transmission ability of command information system network
Xin WANG Pei-yang YAO Xiang-xiang ZHOU Jie-yong ZHANG
Journal of Computer Applications    2011, 31 (08): 2033-2036.   DOI: 10.3724/SP.J.1087.2011.02033
The information transmission ability of a command information system was analyzed from the perspective of uncertainty. The command information system network was divided into a physical layer and a logical layer, and the relationships between information transmission and these layers were expounded. The effective working probability of nodes and links, time delay, logical links and physical links were taken into account. Information quantity was used to measure the uncertainty of information transmission, and the information transmission ability of the command information system was then derived. An experiment was designed using a combat command relationship, reflecting the influences of the above factors on information transmission ability. The experimental results show that the evaluation method takes the demands of connectivity, timeliness and correctness in information transmission into consideration.
Accurate property weighted K-means clustering algorithm based on information entropy
YUAN Fuyong ZHANG Xiaocai LUO Sibiao
Journal of Computer Applications    2011, 31 (06): 1675-1677.   DOI: 10.3724/SP.J.1087.2011.01675
Concerning the initial clustering center generation and the data similarity criterion of the traditional K-means algorithm, an accurate property weighted K-means clustering algorithm based on information entropy was proposed to further improve the clustering accuracy. First, property weights were determined by the entropy weight method to correct the Euclidean distance. Then, high-quality initial clustering centers were chosen by comparing the weighted objective cost functions of the initial clusters, for more accurate and more stable clustering. Finally, the algorithm was implemented in Matlab. The experimental results show that the accuracy and stability of the algorithm are significantly higher than those of the traditional K-means algorithm.
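A minimal sketch of entropy-derived property weights applied to K-means, assuming the standard entropy-weight method; scaling each attribute by the square root of its weight makes ordinary K-means minimize the weighted Euclidean distance. The paper's initial-center selection via the weighted cost function is not reproduced, and the data are synthetic.

```python
# Entropy-weight method: attributes with more information (lower entropy of their
# normalized distribution) receive larger weights in the distance metric.
import numpy as np
from sklearn.cluster import KMeans

def entropy_weights(X):
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                              # column-wise proportions
    P = np.where(P > 0, P, 1e-12)
    e = -(P * np.log(P)).sum(axis=0) / np.log(len(X))  # information entropy per attribute
    d = 1.0 - e                                        # degree of divergence
    return d / d.sum()                                 # normalized attribute weights

rng = np.random.default_rng(5)
X = np.abs(rng.normal(loc=[1, 5, 10], scale=[0.1, 1.0, 3.0], size=(300, 3)))
w = entropy_weights(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X * np.sqrt(w))
print("weights:", np.round(w, 3), "cluster sizes:", np.bincount(labels))
```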
Extraction of Chinese term based on Chi-square test
Wen-min HU Ting-ting HE Yong ZHANG
Journal of Computer Applications   
Term discovery has very important applications in Chinese information processing and language learning. A method for the extraction of Chinese terms based on the Chi-square test was proposed. First, Web documents were downloaded to build a corpus; then prime words were extracted by using the F-MI parameter improved by mutual information, while combined words were extracted by the Chi-square test with the help of the decomposition of prime strings. The experiments show that the algorithm can effectively improve the precision of Chinese term extraction.
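A minimal sketch of the Chi-square association test for a candidate two-part combination, assuming a 2×2 contingency table of co-occurrence counts; corpus construction and the F-MI prime-word step are not reproduced, and the counts are hypothetical.

```python
# Chi-square statistic of a 2x2 contingency table: a strongly associated pair of
# components yields a large value and is a good candidate combined term.
def chi_square(n11, n12, n21, n22):
    """n11: freq(A with B); n12: freq(A without B); n21: freq(B without A); n22: freq(neither)."""
    n = n11 + n12 + n21 + n22
    num = n * (n11 * n22 - n12 * n21) ** 2
    den = (n11 + n12) * (n21 + n22) * (n11 + n21) * (n12 + n22)
    return num / den

# Hypothetical counts for a strongly associated pair versus a weakly associated one.
print(chi_square(80, 20, 15, 9885))    # strongly associated -> large statistic
print(chi_square(5, 995, 95, 8905))    # weakly associated -> small statistic
```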
Approximately duplicated records examining method and its application in ETL of data warehouse
Yong ZHANG
Journal of Computer Applications   
Examining and eliminating approximately duplicated records is one of the main problems that must be solved for data cleaning and improving data quality. Position-coding technology was introduced into the ETL process of data warehouses, and a novel examining algorithm for approximately duplicated records named the Position-Coding Method (PCM) was presented. The algorithm can be applied to Chinese character sets as well as Western character sets. Experimental comparison with previous work indicates that the method is effective.
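As a rough stand-in for the Position-Coding Method (whose details are not given here), the sketch below detects approximately duplicated records by field-wise string similarity using the standard library's difflib; it works for Chinese as well as Western strings because it compares character sequences, and the records and threshold are hypothetical.

```python
# Flag two records as approximate duplicates when every corresponding field pair is
# sufficiently similar.
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    return SequenceMatcher(None, a, b).ratio() >= threshold

records = [("张伟", "北京市海淀区中关村大街1号"),
           ("张伟", "北京海淀区中关村大街1号"),
           ("李强", "上海市浦东新区世纪大道100号")]

dups = [(i, j) for i in range(len(records)) for j in range(i + 1, len(records))
        if all(similar(x, y) for x, y in zip(records[i], records[j]))]
print("approximately duplicated record pairs:", dups)   # [(0, 1)]
```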